The Protein Power LifePlan
by Michael R. Eades, M.D., and Mary Dan Eades, M.D.
Chapter 1
Man the Hunter
The deviation of man from the state in
which he was originally placed by nature seems to have proved to him a
prolific source of diseases.
—Edward Jenner (1749-1823)
In our living room on the coffee table sits
one of our most prized possessions, a fifteen-to-twenty-thousand-year-old
cave-bear skull that we got from Russia. From the back of the head to the snout,
the skull measures almost two feet in length and sports canine teeth that
are three inches long. The entire animal would have been about seven to
eight feet tall and weighed close to a thousand pounds. Examination of
this skull shows a huge ridge running along the top, where the muscles
that worked the jaws were connected. From here they ran along the face and
attached to bony protrusions (called the mandibular rami) on the lower
jaw. The larger the mandibular ramus, the greater the mass of the muscle
attached to it and the greater the closing force of the jaws. The
mandibular rami of our cave bear are about the size of a child's hand, and
when you compare them to the size of the rami of a human jaw, or even a
dog's jaw, which are both about the size of a dime, you can imagine the
crushing strength in the jaws of this creature.
Cave bears used to roam the fields and
forests of prehistoric Europe, until they were hunted to extinction by
early man. As we gaze at our skull and envision the eight-foot,
thousand-pound beast with the three-inch teeth, the four-inch claws, and
the jaw strength to snap a man in two, we can begin to appreciate how
great our primeval ancestors' need for meat must have been. When we picture this creature snarling and gnashing its teeth, slashing with giant claws, charging and roaring, it almost defies imagination that people just like us went after such beasts with not much more than sharpened sticks. But they did,
and did it so well that cave bears are no more. And we are still here and
carry in our genes this same need for meat that drove our forebears to
brave tooth and claw to get it.
Despite these facts, we still regularly
receive letters that question exactly what kind of diet our ancient
ancestors actually ate. Although in anthropological circles there's absolutely no debate about it—every respected authority will
confirm that we were hunters—many people still believe in the
"dangers" of meat eating in light of our supposed vegetarian
past. We've had at least twenty people send us copies of the same table
published in an anti-meat book from the 1970s showing how sundry parts of
our anatomy or physiology are more like those of herbivores than of
carnivores, thus "proving" our vegetarian inclinations. We are,
of course, neither. We're omnivores, able to subsist on meat and
plants—hence the intermediate size of our intestinal tracts. Recently we
received a newsletter clipping quoting a well-known doctor on the subject
of our vegetarian past, as well as an e-mail from a Protein Power
devotee in Italy whose physician had forbidden him to eat meat because it
was "a silent poison." We even had one indignant reader tell us
in no uncertain terms that she was abandoning our program unless we could
answer to her satisfaction the questions that were raised by the quote,
boldly circled in red, in her church bulletin, which she enclosed. The
little blurb pronounced with great authority that the human body was
designed to eat only food of plant origin and that meat
"putrefies" in the human colon, becoming a poison. The physician
from the (as always) prestigious medical school who had made this
statement was someone totally unknown to us, and after a diligent search,
we discovered he had been dead for over a hundred years. Such are the
myths and misconceptions about what we humans were designed to eat.
Our meat-eating heritage—a topic we
thought we'd covered sufficiently in our previous book—is an inescapable
fact. But to be certain that this time we leave no room for doubt, we will
delve back into the issue more deeply and lay out the facts of the matter
so that you'll be armed with the truth and prepared to defend your
nutritional choice with authority.
You'll hear it said, usually by those
espousing vegetarianism for ideological reasons, that primitive tribes
that eat a mainly plant-based diet enjoy better health. For instance, such
authorities frequently cite the lower-than-average (by American standards) cholesterol levels of a typical male of the !Kung tribe (a commonly studied contemporary hunter-gatherer society that is chiefly vegetarian) as proof
of the health benefits of meatless living. While it's true that some
predominantly vegetarian hunter-gatherer groups (a minority of such
groups, as we shall see later) have low rates of the "chronic
diseases of affluence," it doesn't necessarily follow that this good
fortune is a result of their diet. Consider the Masai, another intensively studied African group: pastoralists who subsist mainly on the meat, milk, and blood of the cattle they herd, yet who are famous for their incredibly low cholesterol and blood pressure levels even into advanced age despite their enormous intake of fat. Here we've got two totally different diets—the !Kung and
the Masai—and the followers of both have a low incidence of chronic
diseases. Obviously there are other factors at play in the development of
these diseases besides just diet, so let's take a closer look at the
issue.
Anthropologists have known for decades that
the health of humanity took a turn for the worse when our ancestors
abandoned their hunter-gatherer means of subsistence in favor of the farm
somewhere between eight-thousand and ten-thousand years ago. The fossil
record leaves little doubt that compared to their farming successors, the
hunters were more robust, had greater bone density, decreased infant
mortality, a longer life span, a lower incidence of infectious diseases
and iron-deficiency anemia, fewer enamel defects, and little or no tooth
decay.
Humans have followed a Paleolithic diet for
a few million years and a "modern" agricultural diet for only a
few thousand years. The not too gentle forces of natural selection have
spent millennia shaping and molding our evolving line, weeding out those
offshoots and mutations that didn't thrive on the available fare,
reinforcing those traits that improved our survival, until we emerged as
modern humans some one-hundred-thousand years or so ago. Since our modern
form and physiology today are the same as those of these one-hundred-thousand-year-old ancestors, it stands to reason that we
should function best on the diet they—and we, their descendants—were
designed to eat, not necessarily the "prudent" diet recommended
by modern nutritionists, which is often composed primarily of foods that
weren't even in existence for the vast majority of our time on earth. It
is by turning to the vast amount of anthropological data that we can
determine what our ancestors ate for the three to four million years that
we have been recognizable as humans.
In a Word: Meat
In anthropological research if you follow
the trail of meat consumption, you'll find the history of our earliest
ancestors, because there is no real debate among anthropologists about
early man's history as a meat eater and his evolution into a skilled
hunter; the only debate is about when this hunting ability became fully
developed.
Upon the discovery of the first fossils of our earliest upright ancestors, anthropologists postulated that these creatures, the australopithecines, and those that followed them until the advent of agriculture were "bloodthirsty, savage" hunters. As
archeologists developed more technologically sophisticated means of
analyzing their collections of bones and tools, thinking drifted from the
idea of early man as hunter to that of early man as scavenger. Gone was
the notion of groups of skilled hunters stalking, bringing down, and
butchering large herbivores; in its place was the vision of groups of
hominids coming upon the kills of large carnivores and stripping the
remaining bits of flesh from the carcasses and using primitive tools to
pummel and break into the cavities of the long bones and skulls to get at
the marrow and brains within. The mainstream archeological and
anthropological view posits that this scavenging lifestyle predominated
until the last one-hundred-thousand years or so, coinciding with the
arrival on the scene of anatomically modern humans. But, thanks to recent
findings, this view is changing—and changing in almost flashback fashion
to the ideas of the earlier anthropologists. Our ancestors from a long,
long way back indeed appear to have been skilled hunters.
New excavations in Boxgrove, England, and
Atapuerca, Spain, reveal that hominids as far back as
five-hundred-thousand or more years ago were exquisitely skilled hunters.
Archeologists at Boxgrove found evidence of numerous kill and/or butcher
sites of extinct horses, rhinoceroses, bear, giant deer, and red
deer—all large mammals requiring a great deal of skill and fortitude to
bring down with primitive implements. Researchers know these animals were
hunted and not just found and scavenged, not only because of the arrangement of bones at the butcher sites but because of microscopic evidence
as well. When analyzed under a microscope, the bones of scavenged
carcasses typically show the cut marks from the tools of the scavengers
lying over the tooth marks of the carnivores that actually made the kill,
indicating that the scavenging came later. At Boxgrove, archeologists found
just the opposite. The cut marks from the flint tools on the bones show
evidence that tendons and ligaments were severed to remove muscles from
the bones. The cut marks compare to those produced by today's butchers
using modern tools. In the words of Michael Pitts and Mark Roberts, two of
the primary excavators at Boxgrove, "every animal for which there is
any evidence of interference by the hominids has been carefully, almost
delicately, butchered for the express purpose of consuming the meat."
Further evidence of hunting comes from
several actual wooden spears found throughout Europe that have proven to
be the oldest wooden objects of known use found anywhere in the world.
Archeologists have dated an almost sixteen-inch-long spear tip carved of
yew wood found in 1911 in Clacton, England, to be somewhere between
360,000 and 420,000 years old. Another spear, also made of yew, that is
almost eight feet long and dated to 120,000 years old was found amid the
ribs of an extinct elephant in Lehringen, Germany, in 1948. A few years
ago excavators in a coal mine near Schöningen, Germany, found three
spruce wood spears shaped like modern javelins, the longest of which
measured over seven feet, that proved to be 300,000 to 400,000 years old.
And at one of the butcher sites at Boxgrove, excavators actually found a
fossilized horse scapula that shows what appears to be a spear wound.
The excavation at Boxgrove provided
archeologists with another surprise. It had long been thought that such
stone tools as arrowheads and hand axes, once fashioned, were carried
around by their makers and used as needed, much as we do today with modern
hunting knives and other camp tools. Researchers who have practiced making
prehistoric tools and arrowheads from flint—flint knapping, as it's
called—found the task tedious, difficult, and fraught with the constant
risk that one wrong strike could destroy the tool in the making. As a
result, the thinking was that the effort put into making quality stone
tools was so great that the makers would surely value them and keep them
as long as they could. Amazingly, it appears from the meticulous
examination of these ancient sites that these hominid hunters were so
adept at making flint tools for butchery that they knocked them off on the
spot, used them to skillfully dismember their prey, and left them at the
site rather than carry them around. And these weren't just crude flint
chips; these were some of the finest flint hand axes ever found. Modern
attempts to reproduce the quality of these tools have usually fallen far
short of the mark. Obviously these ancient hominids were skilled enough to
whip out a flawlessly made butchering tool at a moment's notice, a fact
that implies a lifetime of hunting, butchering, and meat consumption.
We know from these European sites that
hominids were actively hunting and eating meat as far back as
five-hundred-thousand years ago, but what about before that? The earliest
stone tools date to around 2.6 million years ago and have been found in
association with extinct animals' bones from the same period. Some of
these have cut marks with overlying carnivore teeth marks, indicating
hunting, while others have carnivore teeth marks with overlying cut marks,
implying scavenging. The most probable conclusion is that protohumans back
at least 2.6 million years ago—a time corresponding to the appearance of
the genus Homo—were engaged in the consumption of meat by either
scavenging or hunting activities and probably a combination of the two.
Prior to 2.6 million years ago the human
line was represented by the australopithecines, long believed to be
primarily fleshy fruit eaters. So, it was thought, the human line
developed the taste for meat sometime between the plant-eating
australopithecines and the appearance of Homo, but even that time
frame has now been pushed back. Anthropologists Matt Sponheimer and Julia
Lee-Thorp from Rutgers University and the University of Cape Town,
respectively, performed an ingenious analysis on the remains of four
three-million-year-old Australopithecus africanus specimens found
in a cave in South Africa. Bones of this age are always fossilized, thus
preventing researchers from extracting living material from them for
analysis, but not so for the tooth enamel; tooth enamel persists
relatively unchanged through the millennia and lends itself to testing for
organic content. Whatever is incorporated into the developing enamel stays
there—in this case for three million years. By analyzing the carbon isotopes in the tooth enamel, researchers can determine what the owner of the tooth ate, because different food sources leave different carbon isotope signatures. When Sponheimer and Lee-Thorp analyzed the
australopithecine enamel for the content of Carbon-13, a heavy isotope
typically found in grasses and in the flesh of grass-eating animals, they
found plentiful amounts, indicating that these hominids ate either a fair
amount of grass or grass-eating animals or both. Analysis of the surfaces
of the teeth, however, didn't show the specific scratches that are the
telltale signs of grass eaters, leading the researchers to conclude that
australopithecines ate meat at least as far back as three million years ago.
We have evidence tracking back three
million years for meat eating by our ancestors and at least a
five-hundred-thousand-year history of skillful hunting. In terms of
generations this means that we modern humans are the result of
one-hundred-fifty-thousand generations of meat eating,
twenty-five-thousand generations of skilled hunting, but only a mere
four-hundred to five-hundred generations of agriculture. Since geneticists
calculate that it takes at least two-thousand generations for even minimal
changes to be manifest, it should be apparent that eons of meat eating
forged our physiology and metabolism to function optimally on a diet
containing significant amounts of meat. A low-fat, high-carbohydrate diet,
the real fad diet in evolutionary terms, limits the consumption of the
meat we were designed by nature to eat and replaces it with starchy foods
that our bodies haven't had the time to adapt to. It's no wonder the
low-fat diet wasn't what it was cracked up to be. It's far too new for our
bodies to know what to do with.
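For readers who like to check the arithmetic, here is a quick sketch of those generation counts. It assumes the roughly twenty-year generation length implied by the figures above (the exact generation length is our illustrative assumption):

    # Generation counts implied by the text, assuming ~20 years per generation.
    YEARS_PER_GENERATION = 20  # illustrative assumption

    eras = {
        "meat eating": 3_000_000,    # ~3 million years
        "skilled hunting": 500_000,  # ~500,000 years
        "agriculture": 10_000,       # ~10,000 years
    }

    for era, years in eras.items():
        print(f"{era}: about {years // YEARS_PER_GENERATION:,} generations")

    # Prints roughly 150,000, 25,000, and 500 generations -- agriculture falls far
    # short of the ~2,000 generations geneticists cite for even minimal change.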
Brain Food
Not only was meat a principal source of
nutrition for developing man, it actually was the driving force allowing
us to develop our large brains. For years anthropologists argued that we
humans got our large brains because we had to develop them to learn
hunting strategies to capture and kill game much larger, faster, and
meaner than ourselves. Anthropologists Leslie Aiello and Peter Wheeler
turned that idea on its head in a brilliant paper postulating that we were
able to develop our large brains not to learn to hunt but because the
fruits of our hunting—nutrient-dense meat—allowed us to decrease the
size of our digestive tracts. The more nutrient dense the food, the less
digestion it needs to extract the nutrients, and consequently the smaller
the digestive tract required. (The human digestive tract, while longer than that of a true carnivore, is the shortest of any primate's.)
Is meat really that nutritionally dense?
Let's take a look at a few examples of meat compared to plant foods and
see. First, let's look at protein. Protein is the only true essential
macronutrient. Fat is also essential, but you can go a lot longer without
fat than you can without protein. (Carbohydrates, the third macronutrient,
are totally unessential to human health.) So, if you are trying to get
protein you could eat 8 ounces of elk meat, a small amount by Paleolithic
standards, and get about 65 grams of it. Or you could eat almost 13 heads
of lettuce to get the same amount. Or 56 bananas or 261 apples or even 33
slices of bread. If you're trying to get methionine, an essential amino
acid that the body uses to make glutathione, its major antioxidant, you
could eat the same 8 ounces of elk, or you could eat any of the following:
22 heads of lettuce, 127 bananas, 550 apples, or 46 slices of bread. In
almost any nutrient category you want to look at, meat is going to come
out a winner because of its incredible nutritional richness that doesn't
require much digestive activity to get to.
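To make the comparison concrete, here is a small sketch that computes how many servings of each plant food it would take to match the protein in 8 ounces of elk. The per-serving protein values are our own rough assumptions, so the output only approximates the figures quoted above:

    # Servings of plant foods needed to match ~65 g of protein from 8 oz of elk.
    # Per-serving protein values are rough assumptions for illustration only.
    import math

    TARGET_PROTEIN_G = 65

    protein_per_serving_g = {
        "heads of lettuce": 5.0,
        "bananas": 1.2,
        "apples": 0.25,
        "slices of bread": 2.0,
    }

    for food, grams in protein_per_serving_g.items():
        print(f"{math.ceil(TARGET_PROTEIN_G / grams)} {food}")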
Table 1.1 shows the differences among the digestive tracts of a sheep (a true herbivore), a dog (primarily a carnivore), and a human. Let's take a look and see where our species falls in the spectrum from carnivorous to vegetarian traits.
But What If I'm a Vegetarian?
A larger percentage of our patients than
you might imagine are vegetarian to some degree. With some modifications,
the Protein Power LifePlan works fine for vegetarians, but before
we start patients on the vegetarian version we always inquire as to their
rationale for following such a diet. If they are vegetarians because they
believe it to be a healthier way to eat, we disabuse them of that notion
quickly. If, on the other hand, they are vegetarians for ideological
reasons, we have no quarrel with that and we help them modify our program
to solve their health problems within the limits of their ideology. We do,
however, encourage them to read a fascinating little book entitled The
Covenant of the Wild that goes a long way toward removing many of the
inhibitions that some people have about using animals for food.
Were We Hunter-Gatherers or
Gatherer-Hunters?
What about the gathering that went along
with the hunting? Don't we have a history of a fair amount of plant
consumption along with our meat eating? How about the ancient potatoes
that went along with our mastodon steak? Until the advent of fire about
five hundred thousand years ago, it was fairly difficult for our
predecessors to get enough calories from plant foods because the plants
themselves fought back by evolving anti-nutrients. Anti-nutrients are
chemicals within the plants that bind with the nutrients, making them
unavailable for absorption by potential herbivorous predators. (See
chapter 6, "The Leaky Gut: Diet and the Autoimmune Response,"
for more details.) Often we lose sight of the fact that, like humans and
other species, plants evolve, too. The inner goal of plants is to live
long, prosper, and disseminate as many seeds as possible in order to
propagate the species. If a particular plant is tasty and easy to harvest
(we're talking about plants in the wild, not hybrid plants that we put in
gardens today), it doesn't last long and certainly doesn't get much of a
chance to spread its seeds. Plants, however, that develop (via natural
selection) a means to keep from being eaten, whether by growing protective
thorns or stickers, acquiring a particularly nasty taste, or producing
anti-nutrients, survive to reproduce and multiply. The variety of plant
foods available to the vast majority of evolving humans simply wasn't
enough to nourish them without a generous amount of meat in the diet. In
fact, Cambridge anthropologist Robert Foley says that hunter-gatherers "along
with modern agriculturalists . . . are an evolutionarily derived form that
appeared towards the end of the Pleistocene [ten thousand or so years ago]
as a response to changing resource conditions." In other words,
according to Dr. Foley, gathering, like agriculture, is a recent
phenomenon, not a lifestyle that has its roots in several million years of
evolution. That said, it's interesting to find that hunter-gatherers (low-fat proponents always want to call them gatherer-hunters) are primarily meat eaters.
Most of the commonly accepted information
about hunter-gatherers comes from a paper by R. B. Lee that was presented
at a 1968 symposium in Chicago called, strangely enough considering the
data presented, "Man the Hunter." Using the 1967 edition of
Murdock's Ethnographic Atlas, a compilation of data about 862 of
the world's societies, Lee concluded that the average hunter-gatherer got
about 65 percent of his calories from plants and the remaining 35 percent
from animals. This paper with its 65:35 plant-to-animal-food ratio has
been quoted extensively in both the medical and the anthropological
literature and used as the basis for the calculations of the prehistoric
diet by innumerable authors who have promoted the idea that the diet of
evolving man was mainly plant based. Unfortunately it is incorrect.
A colleague and good friend of ours, Loren
Cordain, Ph.D., professor at Colorado State University, one of the world's
experts on the Paleolithic diet, and one of the most industrious human
beings we've ever known, sensed that there was something not quite right
about Lee's paper and decided to investigate the data himself. Dr.
Cordain's first clue that something was amiss was unbelievably basic and
had been overlooked by all the researchers who had used Lee's paper as the
basis of their own work. He simply ran a computerized nutritional analysis
of a typical hunter-gatherer diet using the 65:35 plant-to-animal-food
ratio. He discovered that for a human to get the calories needed to live
on a diet of this nature using plants commonly available to a
hunter-gatherer, he would have to gather approximately twelve pounds of
vegetation daily, an unlikely scenario, to say the least.
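You can reproduce the flavor of that back-of-the-envelope check yourself. The daily energy requirement and the average caloric density of wild plant foods below are our own illustrative assumptions, not Dr. Cordain's actual inputs:

    # Rough check: how much wild vegetation would supply 65 percent of daily calories?
    DAILY_KCAL = 3000                  # assumed energy need of an active forager
    PLANT_SHARE = 0.65                 # Lee's reported plant share of calories
    WILD_PLANT_KCAL_PER_GRAM = 0.35    # assumed average density of wild plant foods
    GRAMS_PER_POUND = 453.6

    grams_needed = DAILY_KCAL * PLANT_SHARE / WILD_PLANT_KCAL_PER_GRAM
    print(f"about {grams_needed / GRAMS_PER_POUND:.1f} pounds of vegetation per day")
    # roughly 12 pounds per day under these assumptions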
After making this discovery, Dr. Cordain
reviewed Lee's original paper and calculations and unearthed some
startling facts. Lee only used 58 of the 181 hunter-gatherer societies
listed, and he didn't include animal foods obtained from fishing in his
calculations. Moreover, he classified the collection and consumption of
shellfish as a gathering activity. The Ethnographic Atlas itself categorizes the collection and consumption of small land fauna (insects, invertebrates, small mammals, amphibians, and reptiles) as gathering, thereby ascribing many of the actual animal-derived calories to the plant category.
Dr. Cordain turned to the 1997 update of
the Ethnographic Atlas, which represents 1,267 of the world's
societies, 229 of which are hunter-gatherers, and did his own
calculations. Using all the hunter-gatherer societies listed and putting
fishing and shellfish gathering into the appropriate hunter category, he
found that the 65:35 values of Lee were flipped. Dr. Cordain calculated
the actual plant-to-animal-food ratio to be 35 percent plant, 65 percent
animal. He found that the majority of hunter-gatherers throughout the
world get over half their subsistence from animal foods, while only 13.5
percent of the world's hunter-gatherers derive more than half their food
from gathering plants. And these figures would lean even more in the
direction of animal food were it not for the bias built into even the
updated Ethnographic Atlas by the inclusion of small animals,
reptiles, worms, grubs, etc., in the plant category.
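The effect of that reclassification is easy to see with a toy example. The subsistence percentages below are made up for illustration; they are not actual Ethnographic Atlas data:

    # How reclassifying fishing and shellfish collection changes the plant:animal split.
    subsistence_pct = {            # hypothetical percentages for one society
        "plant gathering": 30,
        "shellfish collection": 15,
        "fishing": 25,
        "hunting": 30,
    }

    def plant_vs_animal(count_fishing_and_shellfish_as_gathering):
        plant_keys = ["plant gathering"]
        if count_fishing_and_shellfish_as_gathering:
            plant_keys += ["shellfish collection", "fishing"]
        plant = sum(subsistence_pct[k] for k in plant_keys)
        return plant, 100 - plant

    print(plant_vs_animal(True))   # Lee-style bookkeeping: (70, 30) plant:animal
    print(plant_vs_animal(False))  # reclassified:          (30, 70) plant:animal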
Our primitive ancestors, whether hunters or
hunter-gatherers, by all accounts lived fairly prosperous lives, at least
by their standards. They lived in small, closely knit groups, and compared
to the early farmers that followed them, they had much better health,
greater stature, more children reaching maturity, and a longer life span.
Turning to an agricultural existence forced a reliance on a smaller number of foods, and since no single plant food provides a full complement of all
the nutrients humans need, many people suffered nutritional deficiencies.
And if the crop failed, famine set in—an experience foreign to most of
the hunter-gatherer populations because they were always on the move,
traveling to wherever game was plentiful and the gathering grounds fertile. A system in which large groups of people lived in close
proximity, at least where early man was concerned, wasn't really all that
advantageous. Most of the infectious diseases that have caused so much
misery throughout history—smallpox, cholera, tuberculosis, and a host of
other bacterial and viral infections—became problems only after the
advent of agriculture and the development of cities. All this raises the
question, why did humans ever settle down and become civilized? Why did
they leave their Garden of Eden, give up their hunting jobs requiring only
a few hours of work per day, and submit to the backbreaking toil of an
agricultural life? It just doesn't make sense.
This question has been pondered ever since
anthropologists figured out that humans made this transition, and, as you
might expect, almost as many hypotheses have been put forward as there are anthropologists. Greg Wadley and Angus Martin, researchers at the University of Melbourne in Australia, have proposed an engaging theory
that makes a lot of sense to us. They point out that there exists a
considerable amount of research establishing that cereal grains (especially wheat, maize, and barley) and, to a lesser extent, dairy products contain opioid substances called exorphins. Opioid substances are
those that have an opium-like effect, stimulate the opioid receptors in
the brain, and are to varying degrees addictive. When bands of primitive people stumbled onto patches of wild grains and ate them, they discovered the reward that comes from consuming "addictive" substances, i.e., comfort foods. People quickly developed ways of making these foods
even more edible by grinding and cooking them. The more palatable the grains became through processing, the more they were consumed and the more important the exorphin reward became.
In the words of Wadley and Martin, "At
first, patches of wild cereals were protected and harvested. Later, land
was cleared and seeds were planted and tended, to increase quantity and
reliability of supply. Exorphins attracted people to settle around cereal
patches, abandoning their nomadic lifestyle, and allowed them to display
tolerance instead of aggression as population densities rose in these new
conditions." According to these researchers, then, grains were the
first opiate of the masses!
Whether this theory is the correct one or
not, there is no question in our minds that carbohydrate foods cause
cravings and are, to a certain degree, addictive, particularly those of
cereal grain origin. If you look at any list of the top ten foods consumed
by Americans you will find bread, crackers, chips, breakfast cereals, and
other high-carbohydrate, grain-based products. We have all experienced the
addictive nature of carbohydrates and their ability to override the
feeling of fullness. Think back to the last time you were at a restaurant
or at someone's house for dinner and you ate until you were stuffed. If
one of your dinner mates asked you to try just a bite of the delicious
swordfish (or any other meat dish), you no doubt begged off, saying,
"I'm just too full; I couldn't possibly eat another bite." But
then, if your host or your waiter arrived bearing dessert, you probably
said, "Oh, well, dessert, sure. I'll have some cake"—or ice
cream, or tiramisu, or cobbler, or whatever. You are able to eat the
dessert, which is always rich in carbohydrates, because just the thought
of the carbohydrates overrides your brain signals telling you that you're
full. Carbohydrates seem to trigger no off switch. That's why people who
binge always do so on carbohydrates. No one binges on steak or eggs or
pork chops; they always binge on cookies and candies and other
carbohydrate junk foods. Having taken care of as many carbohydrate junkies
as we have over the past fifteen years, we're convinced that cereal grains and the products made from them have an allure that transcends mere taste-bud stimulation. As Wadley and Martin point out,
"The ingestion of cereals and milk, in normal modern dietary amounts
by normal humans, activates reward centres in the brain. Foods that were
common in the diet before agriculture . . . do not have this
pharmacological property. The effects of exorphins are qualitatively the
same as those produced by other opioid . . . drugs, that is, reward,
motivation, reduction of anxiety, a sense of well-being [i.e., comfort
foods], and perhaps even addiction. Though the effects of a typical meal
are quantitatively less than those of doses of those drugs, most modern
humans experience them several times a day, every day of their adult
lives."
It should be clear by now that whichever
way you look at it, the majority of our time as humans or our
sort-of-human predecessors on this earth has been spent eating meat. The
adoption of agriculture with its dependence on a grain-based diet is a
recent phenomenon, in fact just a second in evolutionary time. The forces
of natural selection haven't yet had anywhere near the time necessary to
mold us to function optimally on a grain-based diet. We are still
operating with forty-thousand-to-one-hundred-thousand-year-old
biochemistry and physiology. Geneticists have evaluated the DNA sequences of humans and our closest relative, the chimpanzee, and found the difference to be a mere 1.6 percent of our genes, meaning we have 98.4 percent
of genes in common with chimpanzees. By determining the rate of genetic
change since we split away from chimpanzees, scientists have been able to
calculate the rate of genetic mutation in humans, which turns out to be on the order of half a percent per million years. That means that
over the past ten thousand years—the time since the advent of
agriculture—we have changed genetically to the tune of about 0.005
percent. That's not much at all. In fact, that means that we have 99.995
percent of our genes identical with those of our big-game-hunting
ancestors. We are they. We have Fred Flintstone bodies living in a George
Jetson world. And therein lies the root of our problems.
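The genetic arithmetic in that paragraph works out as follows (a simple restatement of the figures above, nothing more):

    # Genetic change since the advent of agriculture, per the rates quoted above.
    CHANGE_PCT_PER_MILLION_YEARS = 0.5
    YEARS_SINCE_AGRICULTURE = 10_000

    change_pct = CHANGE_PCT_PER_MILLION_YEARS * YEARS_SINCE_AGRICULTURE / 1_000_000
    print(f"change since agriculture: {change_pct} percent")               # 0.005 percent
    print(f"shared with Paleolithic ancestors: {100 - change_pct} percent")  # 99.995 percent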
In our medical/nutritional practice we view
modern diseases in our patients through the lens of their Paleolithic
ancestry and use the Paleolithic diet and lifestyle with some
twentieth-century modifications as a template to restore their health.
(Throughout this book, we'll hold up that lens to the Paleolithic world to
give you a look at where and how your modern lifestyle and diet may
conflict with it.) We care for patients who have heart disease, elevated
cholesterol and triglyceride levels, diabetes, obesity, high blood
pressure, gastroesophageal reflux, various autoimmune disorders, and a
number of other problems by using a protein-based diet containing a fair
amount of meat. Patients are constantly amazed at how quickly they improve
and often believe that it is nothing short of miraculous. The reality is
that we are just getting them to follow a diet they were intended to eat.
We were designed to function optimally on a particular diet, we stray from
this diet, we develop disease, we return to the correct diet, and the
disease disappears. It's basically as simple as that.
One of the primary ways in which a
Paleolithic nutritional regimen works to resolve these problems is by
lowering insulin levels. Virtually every food our prehistoric ancestors
had available (with the exception of honey) is one that doesn't stimulate
the body to produce much insulin, whereas the vast majority of foods we
eat in today's world do just the opposite and send insulin levels through
the roof. In the next chapter we'll take a look at this most powerful of
our metabolic hormones and learn the havoc it can wreak when we stray from
our ancestral bill of fare.
BOTTOM LINE
The overwhelming mass of scientific
evidence supports the notion that for most of our time on earth, humans
and their pre-human ancestors have eaten meat. By all reputable scientific
accounts, we've been hunting and gathering (with heavy reliance on the
hunting) for the better part of three million years. Eons of natural
selection and human development molded our metabolic machinery to succeed
on this ancient dietary scheme that appears to have included about 65
percent foods of animal origin and about 35 percent foods of plant origin.
Only about ten thousand years ago (at most) did we settle down to
cultivate grains and begin to include them as food in our diets. The
metabolic changes necessary for humans to adapt to this dietary
change—in short, to be able to use these "new" foods
well—would reasonably take a few thousand generations (or about forty
thousand or fifty thousand years). We're simply not there yet—and won't
be anytime soon.
Turning to the use of grains allowed humans
to settle in large cooperative groups necessary to build great
civilizations, but at a price to the individual members of the group.
While we can subsist on grain-based diets, we don't as a species thrive on
them; the fossil record shows that after the adoption of agriculture human
health, stature, and longevity went into sharp decline. In the last
century in the Western world, thanks to a general increase in dietary
protein, we've begun to recover our stature, but because of our continued
heavy reliance on cereal grains, metabolic health still lags. We're
riddled as a society with epidemics of diabetes, high blood pressure,
heart disease, and obesity, all of which we inherited when our ancient
ancestors abandoned their successful hunting-and-gathering lifestyle in
favor of the addictive lure of grains (components of which indeed do
stimulate the narcotic centers of the human brain).
In our medical/nutritional practice, we
care for people with all components of this epidemic of modern diseases.
To restore their health, we advocate a return to the basic nutritional
principles of our ancestral hunting-gathering lifestyle by prescribing a
diet of nutrient-dense foods—meat, fish, and poultry, rich in protein
and good-quality essential fats; fruits, berries, and vegetables, rich in
antioxidants and cancer-fighting substances—and limiting what early
humans never knew existed—grains, refined sugars, and other concentrated
starches.
© 1999 by Michael R. Eades and Mary Dan Eades. Excerpt posted with permission from http://www.twbookmark.com.